On extrinsic information of good binary codes operating over Gaussian channels
Authors
Abstract
We show that the extrinsic information about the coded bits of any good (capacity-achieving) binary code operating over a Gaussian channel is zero when the channel capacity is lower than the code rate and unity when capacity exceeds the code rate; that is, the extrinsic information transfer (EXIT) chart is a step function of the signal-to-noise ratio and is independent of the code. It follows that, for a common class of iterative receivers in which the error-correcting decoder must operate, at the first iteration, at a rate above capacity (such as in turbo equalization, iterative channel estimation, parallel and serial concatenated coding, and the like), classical good codes that achieve capacity over the additive white Gaussian noise (AWGN) channel are not effective and should be replaced by new, different ones. Copyright © 2006 AEIT
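As a rough numerical illustration of this step behaviour (not code from the paper; the function names bi_awgn_capacity and exit_of_good_code and the rate-1/2 example are our own assumptions), the short Python sketch below estimates the binary-input AWGN capacity by Monte Carlo and reports the extrinsic information an ideal capacity-achieving code would exhibit: 0 wherever capacity is below the code rate and 1 wherever it is above.

import numpy as np

def bi_awgn_capacity(snr_db, n_samples=200_000, seed=0):
    # Monte Carlo estimate of the binary-input AWGN capacity (bits/channel use).
    # BPSK symbol x = +1 is sent; with noise variance sigma^2 the channel LLR
    # is L = 2y/sigma^2 and C = 1 - E[log2(1 + exp(-L))].
    rng = np.random.default_rng(seed)
    es_n0 = 10.0 ** (snr_db / 10.0)
    sigma2 = 1.0 / (2.0 * es_n0)          # Es = 1, so Es/N0 = 1/(2*sigma^2)
    y = 1.0 + rng.normal(scale=np.sqrt(sigma2), size=n_samples)
    llr = 2.0 * y / sigma2
    return 1.0 - np.mean(np.logaddexp(0.0, -llr)) / np.log(2.0)

def exit_of_good_code(snr_db, rate):
    # The paper's claim for a capacity-achieving code of the given rate:
    # extrinsic information is 0 below the SNR where C = rate and 1 above it.
    return 1.0 if bi_awgn_capacity(snr_db) > rate else 0.0

if __name__ == "__main__":
    rate = 0.5
    for snr_db in np.arange(-5.0, 0.5, 1.0):
        c = bi_awgn_capacity(snr_db)
        print(f"Es/N0 = {snr_db:+.1f} dB   C = {c:.3f}   "
              f"I_E = {exit_of_good_code(snr_db, rate):.0f}")

For the rate-1/2 example the transition sits roughly at Es/N0 of about -2.8 dB, where the binary-input AWGN capacity crosses 1/2.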
Similar articles
On Extrinsic Information of Good Codes Operating Over Discrete Memoryless Channels
We show that the extrinsic information about the coded bits of any good (capacity-achieving) code operating over a wide class of discrete memoryless channels (DMC) is zero when channel capacity is below the code rate and a positive constant otherwise; that is, the extrinsic information transfer (EXIT) chart is a step function of channel quality, for any capacity-achieving code. It follows that, f...
Analyzing the turbo decoder using the Gaussian approximation
In this paper, we introduce a simple technique for analyzing the iterative decoder that is broadly applicable to different classes of codes defined over graphs in certain fading as well as additive white Gaussian noise (AWGN) channels. The technique is based on the observation that the extrinsic information from constituent maximum a posteriori (MAP) decoders is well approximated by Gaussian ra... (a small numerical sketch of this consistent-Gaussian model appears after the last entry in this list)
Improved low-density parity-check codes using irregular graphs
We construct new families of error-correcting codes based on Gallager’s low-density parity-check codes. We improve on Gallager’s results by introducing irregular parity-check matrices and a new rigorous analysis of hard-decision decoding of these codes. We also provide efficient methods for finding good irregular structures for such decoding algorithms. Our rigorous analysis based on martingale...
On the Distribution of Mutual Information
In the early years of information theory, mutual information was defined as a random variable, and error probability bounds for communication systems were obtained in terms of its probability distribution. We advocate a return to this perspective for a renewed look at information theory for general channel models and finite coding blocklengths. For capacity-achieving inputs, we characterize the ...
Good Error-Correcting Codes Based on Very Sparse Matrices (IEEE Transactions on Information Theory)
We study two families of error-correcting codes defined in terms of very sparse matrices. “MN” (MacKay–Neal) codes are recently invented, and “Gallager codes” were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The decoding of both codes can be tackled with a practical sum-product algorithm. We prove that these codes are “very good...
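The Gaussian-approximation entry above treats the extrinsic messages from the constituent MAP decoders as "consistent" Gaussian LLRs, i.e. L ~ N(sigma^2/2, sigma^2) conditioned on the transmitted bit, whose mutual information with that bit is I(sigma) = 1 - E[log2(1 + exp(-L))]. The following minimal Python sketch (our own illustration under that assumption, not code from any of the cited papers; j_function is a hypothetical name) evaluates this quantity by Monte Carlo.

import numpy as np

def j_function(sigma, n_samples=200_000, seed=1):
    # Mutual information between a bit and a consistent Gaussian LLR
    # L ~ N(sigma^2/2, sigma^2): I(sigma) = 1 - E[log2(1 + exp(-L))].
    rng = np.random.default_rng(seed)
    llr = rng.normal(loc=sigma**2 / 2.0, scale=sigma, size=n_samples)
    return 1.0 - np.mean(np.logaddexp(0.0, -llr)) / np.log(2.0)

if __name__ == "__main__":
    for sigma in (0.5, 1.0, 2.0, 4.0):
        print(f"sigma = {sigma:.1f}   I = {j_function(sigma):.3f}")

I(sigma) increases monotonically from 0 toward 1 as sigma grows, which is why a single scalar suffices to track message quality in EXIT-style analyses built on this approximation.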
Journal: European Transactions on Telecommunications
Volume: 18, Issue: -
Pages: -
Publication year: 2007